Continuum directions for supervised dimension reduction
Author
Abstract
We consider dimension reduction of multivariate data under the existence of various types of auxiliary information. We propose a criterion that provides a series of orthogonal directional vectors that form a basis for dimension reduction. The proposed method can be thought of as an extension of continuum regression, and the resulting basis is called continuum directions. We show that these directions continuously bridge the principal component, mean difference and linear discriminant directions, thus ranging from unsupervised to fully supervised dimension reduction. In the presence of binary supervision, the proposed directions can be used directly for two-group classification. Numerical studies show that the proposed method works well in high-dimensional settings where the variance of the first principal component is much larger than that of the remaining components.
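To make the bridging idea concrete, below is a minimal numerical sketch of a continuum-regression-style criterion. The specific objective T_gamma(w) = (w'd)^2 (w'Sw)^gamma and the tuning exponent gamma are illustrative assumptions, not the paper's exact formulation: with d the mean difference and S the pooled covariance, gamma = 0 recovers the mean difference direction, large gamma approaches the first principal component, and the linear discriminant direction S^{-1}d represents the fully supervised end of such families.

import numpy as np
from scipy.optimize import minimize

# Hypothetical criterion T_gamma(w) = (w'd)^2 * (w'Sw)^gamma over unit
# vectors w; d = mean difference, S = pooled covariance.  gamma = 0 gives
# the mean difference direction, large gamma tends to the first PC.
rng = np.random.default_rng(0)
n, p = 100, 5
X1 = rng.normal(size=(n, p)) @ np.diag([3.0, 1, 1, 1, 1])          # group 1
X2 = rng.normal(size=(n, p)) @ np.diag([3.0, 1, 1, 1, 1]) + 1.0    # group 2 (shifted mean)

d = X2.mean(axis=0) - X1.mean(axis=0)          # mean difference vector
S = (np.cov(X1.T) + np.cov(X2.T)) / 2          # pooled sample covariance

def continuum_direction(gamma):
    """Numerically maximize (w'd)^2 (w'Sw)^gamma over unit vectors w."""
    def neg_log_T(w):
        w = w / np.linalg.norm(w)
        return -(2 * np.log(abs(w @ d) + 1e-12) + gamma * np.log(w @ S @ w))
    w = minimize(neg_log_T, d / np.linalg.norm(d), method="Nelder-Mead").x
    return w / np.linalg.norm(w)

for gamma in [0.0, 0.5, 2.0, 20.0]:
    # the direction drifts from d/||d|| toward the first principal component
    print(gamma, np.round(continuum_direction(gamma), 2))

A series of orthogonal directions, as described in the abstract, could then be obtained by repeating such a maximization in the orthogonal complement of the directions already found; this deflation step is likewise only a sketch of the general idea.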
Similar resources
Gradient-based kernel dimension reduction for supervised learning
This paper proposes a novel kernel approach to linear dimension reduction for supervised learning. The purpose of the dimension reduction is to find directions in the input space that explain the output as effectively as possible. The proposed method uses an estimator for the gradient of the regression function, based on covariance operators on reproducing kernel Hilbert spaces. In comparison wit...
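As a rough illustration of the gradient idea mentioned above, the sketch below fits a kernel ridge regression with an RBF kernel, evaluates its analytic gradient at the training points, and takes leading eigenvectors of the averaged gradient outer products. This simplified gradient-outer-product construction and all parameter choices are assumptions made here for illustration, not the authors' covariance-operator estimator.

import numpy as np

rng = np.random.default_rng(1)
n, p = 300, 6
X = rng.normal(size=(n, p))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=n)   # output depends on two directions

sigma, lam = 1.5, 1e-2
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)      # pairwise squared distances
K = np.exp(-sq / (2 * sigma ** 2))                       # RBF Gram matrix
alpha = np.linalg.solve(K + n * lam * np.eye(n), y)      # kernel ridge coefficients

M = np.zeros((p, p))
for j in range(n):
    # analytic gradient of f_hat at X[j]: sum_i alpha_i K[j, i] (X[i] - X[j]) / sigma^2
    g = ((alpha * K[j])[:, None] * (X - X[j])).sum(axis=0) / sigma ** 2
    M += np.outer(g, g) / n

evals, evecs = np.linalg.eigh(M)                         # eigenvalues in ascending order
B = evecs[:, ::-1][:, :2]                                # estimated 2-dim effective subspace
print(np.round(B, 2))                                    # weight concentrates on coordinates 0 and 1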
Iterative sliced inverse regression for segmentation of ultrasound and MR images
In this study, we propose an integrated approach based on iterative sliced inverse regression (ISIR) for the segmentation of ultrasound and magnetic resonance (MR) images. The approach integrates two stages. The first is an unsupervised clustering stage that combines multidimensional scaling (MDS) with K-Means. The dimension reduction based on MDS is employed to obtain fewer representative variates...
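For the unsupervised first stage described above, a minimal combination of MDS and K-Means might look as follows; the synthetic "pixel" features, the component count and the cluster number are illustrative stand-ins for real ultrasound/MR data, and the subsequent ISIR stage is not shown.

import numpy as np
from sklearn.manifold import MDS
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
# fake pixel feature vectors from two simulated tissue types
features = np.vstack([rng.normal(0.0, 1.0, size=(200, 10)),
                      rng.normal(3.0, 1.0, size=(200, 10))])

Z = MDS(n_components=2, random_state=0).fit_transform(features)       # low-dimensional representatives
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Z)
print(np.bincount(labels))   # rough split into the two simulated tissue types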
Some Tools for Linear Dimension Reduction
Dimension reduction refers to a family of methods commonly used in multivariate statistical analysis. The common objective of all dimension reduction methods is essentially the same: reducing the number of variables in the data while preserving their information content, however that is measured. In linear dimension reduction this is done by replacing the original variables with a l...
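As a generic example of this replacement, the snippet below swaps eight original variables for two linear combinations, here the leading principal component scores computed via the SVD of the centered data; PCA is used only as one convenient way of choosing the combination weights.

import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 8))                 # 100 observations, 8 original variables
Xc = X - X.mean(axis=0)                       # center the variables
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
W = Vt[:2].T                                  # weights defining two linear combinations
scores = Xc @ W                               # the reduced 100 x 2 data set
print(scores.shape)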
Analysis of Correlation Based Dimension Reduction Methods
Dimension reduction is an important topic in data mining and machine learning. In particular, dimension reduction combined with feature fusion is an effective preprocessing step when the data are described by multiple feature sets. Canonical Correlation Analysis (CCA) and Discriminative Canonical Correlation Analysis (DCCA) are feature fusion methods based on correlation. However, they are differen...
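A brief sketch of correlation-based feature fusion with plain CCA is given below: two feature sets describing the same samples are projected onto maximally correlated directions and the projections are concatenated as fused features. The simulated data, the component count and the serial-concatenation fusion are illustrative choices; the discriminative variant (DCCA) additionally uses class labels and is not shown.

import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(4)
n = 200
shared = rng.normal(size=(n, 3))                                          # common latent signal
X1 = shared @ rng.normal(size=(3, 10)) + 0.3 * rng.normal(size=(n, 10))   # feature set 1
X2 = shared @ rng.normal(size=(3, 12)) + 0.3 * rng.normal(size=(n, 12))   # feature set 2

cca = CCA(n_components=3)
Z1, Z2 = cca.fit_transform(X1, X2)            # maximally correlated projections
fused = np.hstack([Z1, Z2])                   # serial feature fusion
print(fused.shape, np.corrcoef(Z1[:, 0], Z2[:, 0])[0, 1])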
Deep Bottleneck Classifiers in Supervised Dimension Reduction
Deep autoencoder networks have successfully been applied in unsupervised dimension reduction. The autoencoder has a "bottleneck" middle layer of only a few hidden units, which gives a low dimensional representation for the data when the full network is trained to minimize reconstruction error. We propose using a deep bottlenecked neural network in supervised dimension reduction. Instead of tryi...
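To illustrate the contrast drawn above, the sketch below replaces the reconstruction objective with a supervised one: a deep network with a narrow bottleneck layer is trained directly on the class labels, and the bottleneck activations serve as the low-dimensional representation. The layer widths, the two-unit bottleneck and the PyTorch formulation are illustrative assumptions, not the architecture from the paper.

import torch
import torch.nn as nn

class BottleneckClassifier(nn.Module):
    def __init__(self, in_dim, n_classes, bottleneck=2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, 16), nn.ReLU(),
            nn.Linear(16, bottleneck),          # low-dimensional bottleneck layer
        )
        self.head = nn.Linear(bottleneck, n_classes)

    def forward(self, x):
        z = self.encoder(x)                     # supervised low-dimensional representation
        return self.head(z), z

model = BottleneckClassifier(in_dim=30, n_classes=3)
x = torch.randn(8, 30)
logits, z = model(x)
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 3, (8,)))
loss.backward()                                 # trained end-to-end on the class labels
print(z.shape)                                  # torch.Size([8, 2])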
Journal: CoRR
Volume: abs/1606.05988
Publication date: 2016